# 3B Parameter Optimization

**Ministral 3b Instruct** (Apache-2.0)

Ministral is a small-scale language model series based on the Mistral architecture; this 3-billion-parameter variant is designed primarily for English text generation tasks.

Tags: Large Language Model, Transformers, English
Publisher: ministral · Downloads: 15.89k · Likes: 53
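
Below is a minimal sketch of loading the model for English text generation with the Hugging Face Transformers library. The repository id `ministral/Ministral-3b-instruct` is an assumption inferred from the publisher and model name listed above and may differ from the actual hosting location.

```python
# Minimal sketch, assuming the model is hosted on the Hugging Face Hub under
# the inferred repo id "ministral/Ministral-3b-instruct".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ministral/Ministral-3b-instruct"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Simple English text-generation example.
prompt = "Write a short note about small language models."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```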